Visual cortex entrains to sign language.

Authors

  • Geoffrey Brookshire
  • Jenny Lu
  • Howard C. Nusbaum
  • Susan Goldin-Meadow
  • Daniel Casasanto
Abstract

Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (≲8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information, regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than the fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent signers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language below 5 Hz, peaking at ∼1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over auditory cortex. Nonsigners also show coherence to sign language, but their entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
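The abstract describes two analysis steps: quantifying visual change over time in the video signal, and measuring coherence between EEG and that visual-change signal. A minimal sketch of how such a pipeline might look is below, in Python; the frame-differencing metric, the resampling step, and the use of scipy.signal.coherence are illustrative assumptions, not the authors' exact method.

```python
import numpy as np
from scipy.signal import coherence

def visual_change(frames: np.ndarray) -> np.ndarray:
    """Quantify visual change as the mean absolute luminance difference
    between consecutive grayscale frames.

    frames: array of shape (n_frames, height, width).
    Returns a 1-D series with one sample per frame transition.
    """
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))

def stimulus_coherence(eeg: np.ndarray, stim: np.ndarray, fs: float,
                       nperseg: int = 2048):
    """Magnitude-squared coherence between one EEG channel and the
    visual-change signal, both sampled at fs Hz."""
    return coherence(eeg, stim, fs=fs, nperseg=nperseg)

# Hypothetical usage: resample the per-frame signal to the EEG rate,
# then inspect coherence in the low-frequency band (<5 Hz).
fs_eeg, fps = 256.0, 30.0                 # assumed sampling rates
frames = np.random.rand(900, 64, 64)      # stand-in for 30 s of video
eeg = np.random.randn(int(30 * fs_eeg))   # stand-in for one EEG channel

vc = visual_change(frames)                # one value per frame step
t_vc = np.arange(len(vc)) / fps
t_eeg = np.arange(len(eeg)) / fs_eeg
vc_rs = np.interp(t_eeg, t_vc, vc)        # resample to the EEG timeline

f, cxy = stimulus_coherence(eeg, vc_rs, fs_eeg)
low = f < 5.0
print("peak coherence below 5 Hz at %.2f Hz" % f[low][np.argmax(cxy[low])])
```

With real data, observed coherence would be compared against a null distribution (e.g., coherence with time-shifted or mismatched videos) to establish significance, since chance correlations alone produce nonzero coherence values.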


Related articles

How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language

To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Direct...


Comparing the Effects of Auditory Deprivation and Sign Language within the Auditory and Visual Cortex

To investigate neural plasticity resulting from early auditory deprivation and use of American Sign Language, we measured responses to visual stimuli in deaf signers, hearing signers, and hearing nonsigners using functional magnetic resonance imaging. We examined "compensatory hypertrophy" (changes in the responsivity/size of visual cortical areas) and "cross-modal plasticity" (changes in audit...


Semiotic Analysis of Written Signs in the Road Sign Systems of Tehran City

Introduction: As a component of the urban landscape, road sign systems are among the most critical elements of urban environments. Generally speaking, written signs dominate the design of these systems. These signs can also foster aesthetic and visual pleasure compellingly and innovatively. Furthermore, they perpetuate a specific image in the minds of their observers. This research seeks to...


Cross-modal integration and plastic changes revealed by lip movement, random-dot motion and sign languages in the hearing and deaf.

Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional MRI to study seven prelingual deaf sig...


Motion-sensitive cortex and motion semantics in American Sign Language

Previous research indicates that motion-sensitive brain regions are engaged when comprehending motion semantics expressed by words or sentences. Using fMRI, we investigated whether such neural modulation can occur when the linguistic signal itself is visually dynamic and motion semantics is expressed by movements of the hands. Deaf and hearing users of American Sign Language (ASL) were presente...



Journal:
  • Proceedings of the National Academy of Sciences of the United States of America

Volume 114, Issue 24

Pages: -

Year of publication: 2017